13 research outputs found

    Quality of experience in telemeetings and videoconferencing: a comprehensive survey

    Telemeetings such as audiovisual conferences or virtual meetings play an increasingly important role in our professional and private lives. For that reason, system developers and service providers strive for an optimal experience for the user while also optimizing technical and financial resources. This leads to the discipline of Quality of Experience (QoE), an active field originating from the telecommunication and multimedia engineering domains that seeks to understand, measure, and design the quality of experience with multimedia technology. This paper provides the reader with an entry point to the large and still growing field of QoE of telemeetings by taking a holistic perspective, considering both technical and non-technical aspects, and focusing on current and near-future services. Addressing both researchers and practitioners, the paper first provides a comprehensive survey of factors and processes that contribute to the QoE of telemeetings, followed by an overview of relevant state-of-the-art methods for QoE assessment. To embed this knowledge into recent technology developments, the paper continues with an overview of current trends, focusing on the field of eXtended Reality (XR) applications for communication purposes. Given the complexity of telemeeting QoE and the current trends, new challenges for QoE assessment of telemeetings are identified. To overcome these challenges, the paper presents a novel Profile Template for characterizing telemeetings from the holistic perspective endorsed in this paper.

    Synchronous Remote Rendering for VR

    Remote rendering for VR is a technology that enables high-quality VR on low-powered devices by offloading heavy computation and rendering to high-powered servers that stream VR as video to the clients. This article focuses on one specific issue in remote rendering: imperfect frame timing between client and server may cause recurring frame drops. We propose a system design that executes synchronously and eliminates this problem. The design is presented, and an implementation is tested on various networks and hardware. The design cannot drop frames due to synchronization issues but may, on the other hand, stall if temporal disturbances occur, e.g., due to network delay spikes or packet loss. However, experiments confirm that such events can remain rare given an appropriate environment. For example, remote rendering on an intranet at 90 fps with a server located approximately 50 km away yielded just 0.002% stalled frames while rendering with extra latency corresponding to the duration of exactly one frame (11.1 ms at 90 fps). In a LAN setting without extra latency, i.e., with latency equal to locally rendered VR, 0.009% stalls were observed over a wired Ethernet connection and 0.058% stalls over 5 GHz wireless IEEE 802.11ac. © 2021 Viktor Kelkkanen et al.
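
    As a rough illustration of the synchronous design described above, the following Python sketch couples one pose upload to exactly one frame download over a blocking TCP link, so a late frame stalls the loop rather than being dropped. All names (FRAME_BYTES, the render/display placeholders, the port) are illustrative assumptions, not the paper's implementation.

```python
# Minimal single-file sketch of a synchronous remote-rendering loop over
# localhost TCP. A fixed-size byte string stands in for an encoded frame.
import socket
import struct
import threading
import time

FRAME_BYTES = 64          # stand-in for one encoded video frame
FRAME_TIME = 1 / 90       # 90 fps target, as in the paper's experiments

def recv_exact(conn, n):
    """Block until exactly n bytes have arrived (stall, never drop)."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("link closed")
        buf += chunk
    return buf

def server(listener):
    conn, _ = listener.accept()
    try:
        while True:
            recv_exact(conn, 24)          # block until the client's pose arrives
            frame = bytes(FRAME_BYTES)    # rendering + encoding would happen here
            conn.sendall(frame)           # reply with exactly one frame per pose
    except ConnectionError:
        pass

listener = socket.create_server(("127.0.0.1", 5000))
threading.Thread(target=server, args=(listener,), daemon=True).start()

client = socket.create_connection(("127.0.0.1", 5000))
for i in range(10):
    t0 = time.perf_counter()
    client.sendall(struct.pack("3d", 0.0, 0.0, float(i)))   # send head pose
    frame = recv_exact(client, FRAME_BYTES)  # a late frame stalls here
    # display(frame) would present the image, locked to the headset's vsync
    wait = FRAME_TIME - (time.perf_counter() - t0)
    if wait > 0:
        time.sleep(wait)                  # pace the loop to the 90 Hz frame time
client.close()
```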

    Bitrate Requirements of Non-Panoramic VR Remote Rendering

    This paper shows the impact of bitrate settings on objective quality measures when streaming non-panoramic remote-rendered Virtual Reality (VR) images. Non-panoramic here means that the images rendered and sent across the network cover only the viewport of each eye, respectively. To determine the required bitrate of remote rendering for VR, we use a server that renders a 3D scene, encodes the resulting images using the NVENC H.264 codec, and transmits them to the client across a network. The client decodes the images and displays them in the VR headset. Objective full-reference quality measures are taken by comparing the image before encoding on the server to the same image after it has been decoded on the client. By altering the average bitrate setting of the encoder, we obtain objective quality scores as functions of bitrate. Furthermore, we study the impact of headset rotation speed, since this also has a large effect on image quality. We determine an upper and a lower bitrate limit based on headset rotation speeds. The lower limit is based on a speed close to the average human peak head-movement speed, 360°/s. The upper limit is based on maximal peaks of 1080°/s. Depending on the expected rotation speeds of the specific application, we determine that a total of 20–38 Mbps should be used at resolution 2160×1200@90 fps, and 22–42 Mbps at 2560×1440@60 fps. The recommendations are given with the assumption that the image is split in two and streamed in parallel, since this is how the tested prototype operates.
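
    The bitrate sweep described above could be reproduced in spirit with a script like the following, which encodes a reference clip at a range of average bitrates and records a full-reference SSIM score per rate. It assumes an ffmpeg build with the h264_nvenc encoder and the ssim filter; the file names and the swept range are placeholders, not the paper's exact setup.

```python
# Hedged sketch of a bitrate-vs-quality sweep using the ffmpeg CLI.
# "reference.y4m" is a placeholder for the captured pre-encode viewport stream.
import re
import subprocess

def ssim_at_bitrate(kbps: int) -> float:
    encoded = f"encoded_{kbps}k.mp4"
    # Encode the reference at the given average bitrate with NVENC H.264.
    subprocess.run(
        ["ffmpeg", "-y", "-i", "reference.y4m",
         "-c:v", "h264_nvenc", "-b:v", f"{kbps}k", encoded],
        check=True, capture_output=True)
    # Compare the decoded output against the pre-encode reference.
    result = subprocess.run(
        ["ffmpeg", "-i", encoded, "-i", "reference.y4m",
         "-lavfi", "ssim", "-f", "null", "-"],
        check=True, capture_output=True, text=True)
    match = re.search(r"All:([0-9.]+)", result.stderr)  # ssim filter logs to stderr
    return float(match.group(1))

# Sweep 10-42 Mbps total; the paper's recommendations fall inside this range.
for kbps in range(10_000, 44_000, 4_000):
    print(f"{kbps / 1000:.0f} Mbps -> SSIM {ssim_at_bitrate(kbps):.4f}")
```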

    Remapping of hidden area mesh pixels for codec speed-up in remote VR

    Rendering VR content generally requires large image resolutions, both because the display is positioned close to the eyes of the user and because of the super-sampling typically used in VR. Due to the requirements of low latency and large resolutions in VR, remote rendering can be difficult to support at sufficient speeds in this medium. In this paper, we propose a method that reduces the required resolution of non-panoramic VR images from a codec perspective. Because VR images are viewed close-up from within a headset with specific lenses, there are regions of the images that remain unseen by the user. This unseen area is referred to as the Hidden-Area Mesh (HAM) and makes up, as one example, 19% of the screen on the HTC Vive headset. By remapping the image in a specific manner, we can cut out the HAM, reduce the resolution by the size of the mesh, and thus reduce the amount of data that needs to be processed by the encoder and decoder. Results from a prototype remote renderer show that the proposed Hidden-Area Mesh Remapping (HAMR) achieves an implementation-dependent speed-up of 10-13% in encoding, 17-18% in decoding, and 7-11% in total, while the negative impact on objective image quality in terms of SSIM and VMAF remains small. © 2021 IEEE.
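
    A minimal numpy sketch of the packing idea follows, assuming a hypothetical elliptical hidden-area mask of roughly the size quoted for the HTC Vive; the scan-order packing below is a simplified stand-in for the specific remapping used in the paper.

```python
# Illustrative sketch of the core idea behind HAMR: pixels hidden by the
# lens mask never reach the codec. Visible pixels are packed into a smaller
# rectangle before encoding and scattered back under the mask after decoding.
import numpy as np

H, W = 1200, 1080                       # one eye's viewport (HTC Vive-like)
rng = np.random.default_rng(0)

# Hypothetical hidden-area mesh: an elliptical mask hiding the corners
# (~21% hidden here, close to the 19% quoted for the Vive).
yy, xx = np.mgrid[0:H, 0:W]
visible = (xx - W / 2) ** 2 / (W / 2) ** 2 + (yy - H / 2) ** 2 / (H / 2) ** 2 <= 1.0

frame = rng.integers(0, 256, (H, W, 3), dtype=np.uint8)  # stand-in render

# Pack: keep only visible pixels in scan order, pad to a full rectangle.
vis_pixels = frame[visible]                       # (N, 3) visible pixels
packed_w = W
packed_h = -(-len(vis_pixels) // packed_w)        # ceil division
packed = np.zeros((packed_h * packed_w, 3), np.uint8)
packed[:len(vis_pixels)] = vis_pixels
packed = packed.reshape(packed_h, packed_w, 3)    # smaller image for the codec

print(f"visible: {visible.mean():.1%}, codec input {H}x{W} -> {packed_h}x{packed_w}")

# Unpack on the client: scatter decoded pixels back under the mask.
restored = np.zeros_like(frame)
restored[visible] = packed.reshape(-1, 3)[:len(vis_pixels)]
assert np.array_equal(restored[visible], frame[visible])
```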

    Hand-Controller Latency and Aiming Accuracy in 6-DOF VR

    All virtual reality (VR) systems have some inherent hand-controller latency, even when operated locally. In remotely rendered VR, additional latency may be added due to the remote transmission of data, commonly conducted through shared low-capacity channels. Increased latency negatively affects the performance of the human VR operator, but the level of detriment depends on the given task. This work quantifies the relations between aiming accuracy and hand-controller latency, virtual target speed, and the predictability of the target motion. The tested context involves a target that changes direction multiple times while moving in straight lines. Given the tested context, the main conclusions are, first, that the predictability of target motion becomes significantly more important as latency and target speed increase; a significant difference in accuracy is generally observed at latencies beyond approximately 130 ms and at target speeds beyond approximately 3.5 degrees/s. Second, latency starts to significantly impact accuracy at roughly 90 ms and approximately 3.5 degrees/s if the target motion cannot be predicted; if it can, the numbers are approximately 130 ms and 12.7 degrees/s. Finally, reaction times are on average 190-200 ms when the target motion changes to a new and unpredictable direction.
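
    The kind of relation this study measures can be illustrated with a toy simulation (not the paper's experiment): a delayed tracker follows a target that moves in straight lines and changes to a random direction at intervals, and the mean angular offset between aim and target is reported per latency. All parameters below are illustrative assumptions.

```python
# Toy model: the "user" aims at where the target was `latency_s` seconds ago,
# i.e. an unpredictable-motion, zero-reaction-time tracker.
import math
import random

def mean_aim_error(latency_s, speed_deg_s, seconds=30.0, dt=0.001):
    random.seed(1)
    az = el = 0.0                          # target direction (azimuth/elevation, deg)
    heading = random.uniform(0, 2 * math.pi)
    trail = []                             # (time, az, el) history for the delayed aim
    errors = []
    t = 0.0
    while t < seconds:
        if random.random() < dt / 2.0:     # direction change roughly every 2 s
            heading = random.uniform(0, 2 * math.pi)
        az += speed_deg_s * math.cos(heading) * dt
        el += speed_deg_s * math.sin(heading) * dt
        trail.append((t, az, el))
        # Drop history older than the latency window; aim at its oldest entry.
        while len(trail) > 1 and trail[1][0] <= t - latency_s:
            trail.pop(0)
        _, aim_az, aim_el = trail[0]
        errors.append(math.hypot(az - aim_az, el - aim_el))
        t += dt
    return sum(errors) / len(errors)

for latency_ms in (0, 90, 130, 200):
    err = mean_aim_error(latency_ms / 1000.0, speed_deg_s=3.5)
    print(f"{latency_ms:4d} ms -> mean aim error {err:.2f} deg")
```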

    VRstalls: A Dataset on the QoE of Frame Stalls in 6-DOF VR

    A dataset has been created from an experiment on the subjective Quality of Experience (QoE) of stalling video feeds in interactive 6-degrees-of-freedom Virtual Reality (6-DOF VR). The main purpose of collecting this data is to provide insight into the QoE of stalls that may occur in interactive remote-rendered 6-DOF VR. The data contains user ratings coupled with various stall lengths, for both frozen and reprojected images. Additionally, user motion and pixel changes (as mean squared errors) between successive images in the vicinity of the stall are also stored. In summary, 29 users participated in testing, each rating 16 scenes, amounting to a total of 464 rated scenes. © 2022 IEEE.
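
    A hypothetical pandas sketch of how such a dataset might be aggregated follows, computing a mean opinion score per stall length, split by frozen versus reprojected stalls. The file name and column names ("mode", "stall_ms", "rating") are assumptions, not the dataset's actual schema.

```python
# Aggregate per-scene ratings into mean opinion scores (MOS) per condition.
import pandas as pd

df = pd.read_csv("vrstalls.csv")       # 29 users x 16 scenes = 464 rows (assumed layout)
mos = (df.groupby(["mode", "stall_ms"])["rating"]
         .agg(["mean", "std", "count"])
         .rename(columns={"mean": "MOS"}))
print(mos)
```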